51 research outputs found

    A clustering heuristic to improve a derivative-free algorithm for nonsmooth optimization

    In this paper we propose a heuristic to improve the performance of CS-DFN, a recently proposed derivative-free method for nonsmooth optimization. The heuristic is based on a clustering-type technique to compute an estimate of Clarke's generalized gradient of the objective function, obtained via calculation of the (approximate) directional derivatives along a certain set of directions. A search direction is then calculated by applying a nonsmooth Newton-type approach. As the numerical experiments show, this direction is a good descent direction for the objective function. We report numerical results and a comparison with the original CS-DFN method, on a set of well-known test problems, to show the utility of the proposed improvement.
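    The pipeline described above (sample approximate gradients, cluster them into an estimate of Clarke's generalized gradient, then extract a descent direction) can be sketched in a few lines. The following is a toy illustration under stated assumptions, not the actual CS-DFN heuristic: finite differences stand in for the directional derivatives, a greedy distance threshold stands in for the clustering technique, and a crude projected-gradient loop on simplex weights approximates the minimum-norm element of the convex hull.

```python
import numpy as np

def approx_gradient(f, x, eps=1e-6):
    """Forward-difference estimate of a (sub)gradient of f at x."""
    g = np.zeros_like(x)
    fx = f(x)
    for i in range(len(x)):
        e = np.zeros_like(x)
        e[i] = eps
        g[i] = (f(x + e) - fx) / eps
    return g

def cluster_gradients(grads, tol=0.5):
    """Greedy distance-based clustering: each gradient joins the first
    cluster whose representative lies within tol, else starts a new one."""
    reps = []
    for g in grads:
        for j, r in enumerate(reps):
            if np.linalg.norm(g - r) <= tol:
                reps[j] = 0.5 * (r + g)  # running average as representative
                break
        else:
            reps.append(g.copy())
    return reps

def min_norm_direction(reps, iters=200, lr=0.1):
    """Negated (approximate) minimum-norm element of the convex hull of
    the cluster representatives, found by a crude projected-gradient loop
    on the simplex weights (clip-and-renormalize, not an exact projection)."""
    G = np.array(reps)
    m = len(reps)
    w = np.full(m, 1.0 / m)
    for _ in range(iters):
        v = w @ G              # current convex combination
        w = np.clip(w - lr * (G @ v), 0.0, None)
        s = w.sum()
        w = w / s if s > 0 else np.full(m, 1.0 / m)
    return -(w @ G)

# Toy nonsmooth objective, sampled near a point where it is locally smooth.
f = lambda x: abs(x[0]) + 0.5 * x[1] ** 2
x = np.array([0.5, 1.0])
rng = np.random.default_rng(0)
samples = [x + 0.01 * rng.standard_normal(2) for _ in range(8)]
d = min_norm_direction(cluster_gradients([approx_gradient(f, s) for s in samples]))
```

    On this example all sampled gradients fall into one cluster, so the computed direction is simply the negated gradient estimate, and a small step along it decreases f.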

    Splitting Metrics Diagonal Bundle Method for Large-Scale Nonconvex and Nonsmooth Optimization

    Nonsmooth optimization is traditionally based on convex analysis, and most solution methods rely strongly on the convexity of the problem. In this paper, we propose an efficient diagonal bundle method for nonconvex large-scale nonsmooth optimization. The novelty of the new method lies in the different usage of metrics depending on the convex or concave behaviour of the objective at the current iteration point. The usage of different metrics gives us the possibility to deal with the nonconvexity of the problem better than the most commonly used, and quite arbitrary, remedy of downward shifting of the piecewise linear model alone does. The convergence of the proposed method is proved for semismooth functions that are not necessarily differentiable nor convex. The numerical experiments have been made using problems with up to a million variables. The results presented confirm the usability of the new method.
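    The switch the abstract describes hinges on the sign of linearization errors: nonnegative for convex behaviour, negative when nonconvexity makes the cutting plane model overestimate the function. The snippet below illustrates only that sign test, paired with a hypothetical metric-selection rule; the paper's actual diagonal update is not reproduced here.

```python
import numpy as np

def linearization_error(f, subgrad, x, y):
    """alpha = f(x) - f(y) - <subgrad(y), x - y>.
    Nonnegative when f is convex; a negative value signals concave
    (nonconvex) behaviour between y and x."""
    return f(x) - f(y) - subgrad(y) @ (x - y)

def diagonal_metric(alpha, n, rho=2.0):
    """Toy metric selection: identity-like diagonal when the data looks
    convex (alpha >= 0), a damped diagonal otherwise. This rule is an
    illustration only, not the method's actual update."""
    return np.ones(n) if alpha >= 0.0 else np.full(n, 1.0 / rho)

x, y = np.array([2.0]), np.array([-1.0])
f_cvx = lambda z: np.abs(z).sum()       # convex: |x|
f_ccv = lambda z: -np.abs(z).sum()      # concave: -|x|
sg_cvx = lambda z: np.sign(z)
sg_ccv = lambda z: -np.sign(z)

a_cvx = linearization_error(f_cvx, sg_cvx, x, y)
a_ccv = linearization_error(f_ccv, sg_ccv, x, y)
```

    For the convex function the error is nonnegative, for the concave one it is negative, and the two cases select different diagonal metrics.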

    Polyhedral separation via difference of convex (DC) programming

    We consider polyhedral separation of sets as a possible tool in supervised classification. In particular, we focus on the optimization model introduced by Astorino and Gaudioso (J Optim Theory Appl 112(2):265–293, 2002) and adopt its reformulation in difference of convex (DC) form. We tackle the problem by adapting the algorithm for DC programming known as DCA. We present the results of the implementation of DCA on a number of benchmark classification datasets.
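    For readers unfamiliar with DCA, each iteration linearizes the concave part of a DC decomposition f = g - h at the current point and minimizes the resulting convex surrogate. A minimal sketch on the toy decomposition g(x) = x**4, h(x) = x**2 (not the polyhedral-separation model of the paper), where the surrogate minimization happens to have a closed form:

```python
def dca_step(x):
    """One DCA iteration for f(x) = x**4 - x**2: linearize h(x) = x**2
    at x (gradient y = 2x), then minimize the convex surrogate
    g(z) - y*z; stationarity 4z**3 = y gives the closed form below."""
    y = 2.0 * x
    return (y / 4.0) ** (1.0 / 3.0) if y >= 0.0 else -((-y / 4.0) ** (1.0 / 3.0))

def dca(x0, iters=60):
    """Iterate DCA to a critical point of f(x) = x**4 - x**2."""
    x = x0
    for _ in range(iters):
        x = dca_step(x)
    return x

x_star = dca(1.0)   # converges to the critical point 1/sqrt(2)
```

    Starting from x0 = 1, the iterates decrease f monotonically and converge to the critical point x = 1/sqrt(2), where f'(x) = 4x**3 - 2x = 0.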

    Least Square Regression Method for Estimating Gas Concentration in an Electronic Nose System

    We describe an Electronic Nose (ENose) system which is able to identify the type of analyte and to estimate its concentration. The system consists of seven sensors, five of them being gas sensors (supplied with different heater voltage values), the remainder being a temperature and a humidity sensor, respectively. To identify a new analyte sample and then to estimate its concentration, we use both machine learning techniques and the least squares regression principle. In fact, we apply two different training models: the first one is based on the Support Vector Machine (SVM) approach and is aimed at teaching the system how to discriminate among different gases, while the second one uses the least squares regression approach to predict the concentration of each type of analyte.
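    The concentration-estimation half of the scheme reduces to an ordinary least squares fit of concentration against the seven sensor readings. A sketch on synthetic data, where the linear response model, the noise level, and the concentration range are all assumptions made for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic sensor array: 7 channels (5 gas sensors plus temperature and
# humidity), with responses assumed linear in concentration plus noise.
true_w = np.array([0.8, 1.1, 0.6, 0.9, 1.3, 0.05, 0.02])   # hypothetical sensitivities
conc = rng.uniform(10.0, 100.0, size=40)                    # hypothetical ppm range
R = np.outer(conc, true_w) + rng.normal(0.0, 0.5, size=(40, 7))

# Least squares fit: find weights w so that R @ w approximates conc.
w, *_ = np.linalg.lstsq(R, conc, rcond=None)
pred = R @ w
```

    Given a new 7-channel reading r, the estimated concentration is simply r @ w; in the real system one such regression would be trained per analyte class identified by the SVM.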

    Gradient set splitting in nonconvex nonsmooth numerical optimization

    We present a numerical bundle-type method for local minimization of a real function of several variables, which is supposed to be locally Lipschitz. We provide a short survey of some optimization algorithms from the literature which are able to deal with both nonsmoothness and nonconvexity of the objective function. We focus on possible extensions of classical bundle-type methods, originally conceived to deal with convex nonsmooth optimization. They are all based on a convex cutting plane model, which has the property of both minorizing the objective function everywhere and interpolating it at certain points. Such properties may be lost whenever nonconvexity is present, and this case may be described in terms of possible negative values of certain linearization errors. We describe some alternative ways in which the problem is dealt with in the literature. Here, on the basis of a classification of the limit points of gradient sequences, we define two distinct cutting plane approximations. We derive an algorithm which uses both such models. In particular, the convex model is primarily adopted to find a tentative displacement from the current stability centre, while the concave one enters into play only when the convex model has failed to provide a sufficient decrease step. Termination of the method is proved and the results of some numerical experiments are reported.
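    The convex ingredient of the two-model scheme is a standard cutting plane model: the pointwise maximum of the linearizations collected in a bundle, minimized together with a proximal term around the stability centre. The sketch below shows only that convex model, on a one-dimensional toy function, with a grid search standing in for the quadratic subproblem solved in real bundle methods; the concave model used in the fallback step is not reproduced.

```python
import numpy as np

def convex_model(bundle, x):
    """Convex cutting plane model: pointwise maximum of the
    linearizations (y_j, f(y_j), g_j) stored in the bundle."""
    return max(fy + g * (x - y) for y, fy, g in bundle)

def model_step(bundle, center, t=1.0):
    """Tentative displacement from the stability centre: minimize the
    model plus a proximal term over a coarse grid (a crude stand-in for
    the quadratic program of real bundle methods)."""
    grid = np.linspace(center - 2.0, center + 2.0, 401)
    vals = [convex_model(bundle, x) + (x - center) ** 2 / (2.0 * t) for x in grid]
    return grid[int(np.argmin(vals))]

# Bundle of two cuts of f(x) = |x|, taken at y = -1 and y = 1;
# their maximum reproduces |x| exactly on this example.
bundle = [(-1.0, 1.0, -1.0), (1.0, 1.0, 1.0)]
step = model_step(bundle, center=0.5)
```

    From the centre 0.5 the proximal model step lands at the minimizer x = 0 of |x|, i.e. the tentative displacement is a descent step for the objective.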

    A view of operations research applications in Italy, 2018

    This book presents expert descriptions of the successful application of operations research in both the private and the public sector, including in logistics, transportation, product design, production planning and scheduling, and areas of social interest. Each chapter is based on fruitful collaboration between researchers and companies, and company representatives are among the co-authors. The book derives from a 2017 call by the Italian Operations Research Society (AIRO) for information from members on their activities in promoting the use of quantitative techniques, and in particular operations research techniques, in society and industry. A booklet based on this call was issued for the annual AIRO conference, but it was felt that some of the content was of such interest that it deserved wider dissemination in more detailed form. This book is the outcome. It equips practitioners with solutions to real-life decision problems, offers researchers examples of the practical application of operations research methods, and provides Master’s and PhD students with suggestions for research development in various fields.

    A Dual Ascent Approach to the Bounded-Degree Spanning Tree Problem

    Given a connected graph G, a vertex is said to be of the branch type if its degree is greater than 2. We consider the problem of finding a spanning tree of G which minimizes the number of branch vertices. Such a problem has been proved to be NP-complete, and some efficient heuristics to solve it have been proposed in the literature. In this paper we present a new heuristic algorithm based on solving the Lagrangean dual of the original mixed integer programming problem by means of a dual ascent procedure requiring the update of one multiplier at a time.
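    To make the dual ascent idea concrete, the sketch below applies it to a generic binary program min c.x subject to Ax >= b, x in {0,1}^n, where the Lagrangian relaxation separates over the variables; one multiplier is raised at a time while its subgradient component stays positive. This is an assumption-laden stand-in for illustration, not the paper's branch-vertex formulation.

```python
import numpy as np

def lagrangian_value(c, A, b, lam):
    """Lagrangian dual function of min c.x s.t. Ax >= b, x in {0,1}^n:
    L(lam) = lam.b + sum_j min(0, c_j - (lam.A)_j), because the inner
    minimization separates over the binary variables x_j."""
    red = c - lam @ A                  # reduced costs
    x = (red < 0).astype(float)        # set x_j = 1 exactly when it pays off
    return lam @ b + red @ x, x

def dual_ascent(c, A, b, step=0.1, sweeps=200):
    """Coordinate-wise dual ascent: raise one multiplier at a time while
    the corresponding subgradient component (b - Ax)_i is positive."""
    lam = np.zeros(len(b))
    for _ in range(sweeps):
        improved = False
        for i in range(len(b)):
            _, x = lagrangian_value(c, A, b, lam)
            if b[i] - A[i] @ x > 0:    # constraint i still violated
                lam[i] += step
                improved = True
        if not improved:
            break
    return lam

# Tiny covering instance: min x1 + x2 s.t. x1 + x2 >= 1 (optimum = 1).
c = np.array([1.0, 1.0])
A = np.array([[1.0, 1.0]])
b = np.array([1.0])
lam = dual_ascent(c, A, b)
val, _ = lagrangian_value(c, A, b, lam)
```

    By weak duality the final value stays below the optimal cost 1, and the ascent stops once the subgradient component turns nonpositive; in the paper this scheme is applied to the Lagrangean dual of the branch-vertex model, where the relaxed subproblem is a tree computation rather than a separable binary one.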

    Application-motivated Nonlinear Programming


    An illumination problem with tradeoff between coverage of a dataset and aperture angle of a conic light beam
